
    The strong weak convergence of the quasi-EA

    In this paper, we investigate the convergence of a novel simulation scheme to the target diffusion process. This scheme, the Quasi-EA, is closely related to the Exact Algorithm (EA) for diffusion processes, as it is obtained by neglecting the rejection step in the EA. We prove the existence of a myopic coupling between the Quasi-EA and the diffusion. Moreover, an upper bound for the coupling probability is given. Consequently, we establish the convergence of the Quasi-EA to the diffusion with respect to the total variation distance.

    Multi-step Richardson-Romberg Extrapolation: Remarks on Variance Control and complexity

    We propose a multi-step Richardson-Romberg extrapolation method for the computation of expectations E f(X_T) of a diffusion (X_t)_{t ∈ [0,T]} when the weak time discretization error induced by the Euler scheme admits an expansion at an order R ≥ 2. The complexity of the estimator grows as R^2 (instead of 2^R) and its variance is asymptotically controlled by using consistent Brownian increments in the underlying Euler schemes. Monte Carlo simulations carried out with path-dependent options (lookback, barrier) support the conjecture that their weak time discretization error also admits an expansion (in a different scale); an appropriate Richardson-Romberg extrapolation then seems to outperform the Euler scheme with Brownian bridge. Comment: 28 pages, to appear in Monte Carlo Methods and Applications Journal
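
    A rough sketch of the underlying idea, under illustrative assumptions (a scalar SDE, two discretization levels and the weights (2, -1); the paper's general multi-step estimator of order R is not reproduced): the coarse and fine Euler schemes are driven by the same Brownian increments, which is the "consistent Brownian increments" device used to control the variance of the extrapolated estimator.

```python
# Sketch: two-level Richardson-Romberg extrapolation of the Euler scheme for E f(X_T),
# for dX_t = b(X_t) dt + s(X_t) dW_t. The coarse scheme reuses the fine scheme's Brownian
# increments so that the variance of the combination stays under control.
import numpy as np

def euler_terminal(b, s, x0, T, n_steps, dW):
    """Euler scheme value at time T, driven by increments dW of shape (n_paths, n_steps)."""
    x = np.full(dW.shape[0], x0, dtype=float)
    h = T / n_steps
    for k in range(n_steps):
        x = x + b(x) * h + s(x) * dW[:, k]
    return x

def rr_two_level(f, b, s, x0, T, n_coarse, n_paths, rng):
    h_fine = T / (2 * n_coarse)
    dW_fine = rng.normal(0.0, np.sqrt(h_fine), size=(n_paths, 2 * n_coarse))
    dW_coarse = dW_fine[:, 0::2] + dW_fine[:, 1::2]       # same Brownian path, coarser grid
    xT_fine = euler_terminal(b, s, x0, T, 2 * n_coarse, dW_fine)
    xT_coarse = euler_terminal(b, s, x0, T, n_coarse, dW_coarse)
    # the weights (2, -1) cancel the first-order term of the weak error expansion
    return np.mean(2.0 * f(xT_fine) - f(xT_coarse))

# usage: geometric Brownian motion, for which E[X_T] = x0 * exp(mu * T) is known
rng = np.random.default_rng(0)
est = rr_two_level(lambda x: x, lambda x: 0.05 * x, lambda x: 0.2 * x,
                   x0=1.0, T=1.0, n_coarse=8, n_paths=100_000, rng=rng)
print(est)
```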

    Asymptotic analysis of model selection criteria for general hidden Markov models

    © 2020 Elsevier B.V. The paper obtains analytical results for the asymptotic properties of Model Selection Criteria – widely used in practice – for a general family of hidden Markov models (HMMs), thereby substantially extending the related theory beyond typical ‘i.i.d.-like’ model structures and filling in an important gap in the relevant literature. In particular, we look at the Bayesian and Akaike Information Criteria (BIC and AIC) and the model evidence. In the setting of nested classes of models, we prove that BIC and the evidence are strongly consistent for HMMs (under regularity conditions), whereas AIC is not weakly consistent. Numerical experiments support our theoretical results.
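
    For concreteness, the criteria themselves are computed from the maximised log-likelihood in the usual way; the sketch below shows the generic formulas with hypothetical log-likelihood values and parameter counts (for an HMM the log-likelihood would come from the forward algorithm, which is not reproduced here).

```python
import numpy as np

def aic(loglik, n_params):
    """Akaike Information Criterion: -2 log-likelihood plus a 2k penalty."""
    return -2.0 * loglik + 2.0 * n_params

def bic(loglik, n_params, n_obs):
    """Bayesian Information Criterion: the penalty grows with the sample size."""
    return -2.0 * loglik + n_params * np.log(n_obs)

# hypothetical maximised log-likelihoods and parameter counts for two nested HMMs
candidates = {"2-state HMM": (-1523.4, 6), "3-state HMM": (-1510.2, 12)}
n_obs = 1000
best = min(candidates, key=lambda m: bic(*candidates[m], n_obs))
print(best)  # the criterion is minimised over the candidate models
```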

    Bayesian inference for partially observed stochastic differential equations driven by fractional Brownian motion

    We consider continuous-time diffusion models driven by fractional Brownian motion. Observations are assumed to possess a nontrivial likelihood given the latent path. Due to the non-Markovian and high-dimensional nature of the latent path, estimating posterior expectations is computationally challenging. We present a reparameterization framework based on the Davies and Harte method for sampling stationary Gaussian processes and use it to construct a Markov chain Monte Carlo algorithm that allows computationally efficient Bayesian inference. The algorithm is based on a version of hybrid Monte Carlo simulation that delivers increased efficiency when used on the high-dimensional latent variables arising in this context. We specify the methodology on a stochastic volatility model, allowing for memory in the volatility increments through a fractional specification. The method is demonstrated on simulated data and on the S&P 500/VIX time series. In the latter case, the posterior distribution favours values of the Hurst parameter smaller than 1/2, pointing towards medium-range dependence.
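
    The Davies and Harte construction mentioned above is a circulant-embedding method for stationary Gaussian processes; a minimal sketch for sampling fractional Gaussian noise with Hurst index H is given below (the reparameterization and hybrid Monte Carlo components of the paper's algorithm are not reproduced, and the variant shown keeps one of the two independent samples the embedding produces).

```python
# Sketch: Davies-Harte / circulant-embedding sampler for fractional Gaussian noise.
import numpy as np

def fgn_davies_harte(n, hurst, rng):
    """Return n increments of fractional Brownian motion with Hurst index `hurst`."""
    k = np.arange(n + 1)
    # autocovariance of fGn: gamma(k) = 0.5 * (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H})
    gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst) - 2 * np.abs(k) ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    # first row of a circulant matrix embedding the Toeplitz covariance matrix
    row = np.concatenate([gamma, gamma[-2:0:-1]])
    lam = np.fft.fft(row).real                       # eigenvalues of the circulant matrix
    if np.any(lam < -1e-12):
        raise ValueError("circulant embedding is not nonnegative definite")
    m = row.size
    z = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    y = np.fft.fft(z * np.sqrt(np.maximum(lam, 0.0))) / np.sqrt(m)
    return y.real[:n]                                # stationary Gaussian vector with fGn covariance

rng = np.random.default_rng(1)
noise = fgn_davies_harte(1024, hurst=0.4, rng=rng)   # H < 1/2: medium-range dependence regime
fbm_path = np.cumsum(noise)                          # fractional Brownian motion on the unit grid
```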

    On the convergence of adaptive sequential Monte Carlo methods

    In several implementations of Sequential Monte Carlo (SMC) methods it is natural and important, in terms of algorithmic efficiency, to exploit the information of the history of the samples to optimally tune their subsequent propagations. In this article we provide a carefully formulated asymptotic theory for a class of such adaptive SMC methods. The theoretical framework developed here will cover, under assumptions, several commonly used SMC algorithms [Chopin, Biometrika 89 (2002) 539–551; Jasra et al., Scand. J. Stat. 38 (2011) 1–22; Schäfer and Chopin, Stat. Comput. 23 (2013) 163–184]. There are only limited results about the theoretical underpinning of such adaptive methods: we will bridge this gap by providing a weak law of large numbers (WLLN) and a central limit theorem (CLT) for some of these algorithms. The latter seems to be the first result of its kind in the literature and provides a formal justification of algorithms used in many real data contexts [Jasra et al. (2011); Schäfer and Chopin (2013)]. We establish that for a general class of adaptive SMC algorithms [Chopin (2002)], the asymptotic variance of the estimators from the adaptive SMC method is identical to that of a “limiting” SMC algorithm which uses ideal proposal kernels. Our results are supported by an application to a complex high-dimensional posterior distribution associated with the Navier–Stokes model, where adapting high-dimensional parameters of the proposal kernels is critical for the efficiency of the algorithm.
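
    A minimal sketch, not any of the cited algorithms, of the kind of adaptation involved: a tempered SMC sampler whose random-walk move scale is tuned from the current particle cloud, i.e. from the history of the samples. The toy target, temperature ladder and tuning rule are illustrative assumptions.

```python
# Sketch: adaptive SMC sampler tempering from a Gaussian prior to a toy target,
# with the MCMC move scale adapted from the spread of the current particles.
import numpy as np

def log_target(x):                                    # toy unnormalised target: a Gaussian mixture
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def adaptive_smc(n_particles, temps, rng):
    x = rng.standard_normal(n_particles)              # sample from the prior N(0, 1)
    logw = np.zeros(n_particles)
    log_prior = lambda y: -0.5 * y ** 2
    for t_prev, t in zip(temps[:-1], temps[1:]):
        # reweight by the incremental change of temperature
        logw += (t - t_prev) * (log_target(x) - log_prior(x))
        w = np.exp(logw - logw.max()); w /= w.sum()
        if 1.0 / np.sum(w ** 2) < 0.5 * n_particles:   # resample when the ESS drops
            x = rng.choice(x, size=n_particles, p=w)
            logw[:] = 0.0
        # adaptive step: scale the random-walk proposal by the particle spread
        scale = 2.38 * np.std(x)
        prop = x + scale * rng.standard_normal(n_particles)
        log_pi = lambda y: t * log_target(y) + (1 - t) * log_prior(y)
        accept = np.log(rng.uniform(size=n_particles)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    w = np.exp(logw - logw.max()); w /= w.sum()
    return x, w

x, w = adaptive_smc(2000, np.linspace(0.0, 1.0, 21), np.random.default_rng(2))
print(np.sum(w * x))                                   # weighted posterior mean estimate
```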

    Sequential Monte Carlo methods for Bayesian elliptic inverse problems

    In this article, we consider a Bayesian inverse problem associated to elliptic partial differential equations in two and three dimensions. This class of inverse problems is important in applications such as hydrology, but the complexity of the link function between unknown field and measurements can make it difficult to draw inference from the associated posterior. We prove that for this inverse problem a basic sequential Monte Carlo (SMC) method has a Monte Carlo rate of convergence with constants which are independent of the dimension of the discretization of the problem; indeed convergence of the SMC method is established in a function space setting. We also develop an enhancement of the SMC methods for inverse problems which were introduced in Kantas et al. (SIAM/ASA J Uncertain Quantif 2:464–489, 2014); the enhancement is designed to deal with the additional complexity of this elliptic inverse problem. The efficacy of the methodology and its desirable theoretical properties are demonstrated for numerical examples in both two and three dimensions.
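
    To make the "link function" concrete, the sketch below sets up a toy version of such a problem: an unknown log-coefficient field enters a one-dimensional elliptic PDE, the PDE is solved numerically, and the data are noisy point observations of the solution. The specific PDE, grid, prior and observation points are illustrative assumptions, not the paper's test problem, and only the posterior evaluation (the target an SMC sampler would use) is shown.

```python
# Sketch: forward map and posterior for a toy 1-d elliptic Bayesian inverse problem.
import numpy as np

def forward_map(log_a, f=1.0, obs_idx=(25, 50, 75)):
    """Solve -(a(x) u'(x))' = f on (0, 1) with u(0) = u(1) = 0 by finite differences."""
    a = np.exp(log_a)                        # log-coefficient given on n cell midpoints
    n = a.size
    h = 1.0 / n
    # assemble the tridiagonal stiffness matrix for the n - 1 interior nodes
    main = (a[:-1] + a[1:]) / h ** 2
    off = -a[1:-1] / h ** 2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    u = np.linalg.solve(A, np.full(n - 1, f))
    return u[np.array(obs_idx) - 1]          # observations of the solution at a few nodes

def log_posterior(log_a, data, noise_std=0.01):
    """Gaussian prior on the log-coefficient plus Gaussian likelihood of the observations."""
    log_prior = -0.5 * np.sum(log_a ** 2)
    resid = forward_map(log_a) - data
    return log_prior - 0.5 * np.sum(resid ** 2) / noise_std ** 2

rng = np.random.default_rng(4)
truth = 0.5 * np.sin(2 * np.pi * np.linspace(0, 1, 100))
data = forward_map(truth) + 0.01 * rng.standard_normal(3)
print(log_posterior(truth, data))
```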

    Error bounds and normalising constants for sequential Monte Carlo samplers in high dimensions

    In this paper we develop a collection of results associated to the analysis of the sequential Monte Carlo (SMC) samplers algorithm, in the context of high-dimensional independent and identically distributed target probabilities. The SMC samplers algorithm can be designed to sample from a single probability distribution, using Monte Carlo to approximate expectations with respect to this law. Given a target density in d dimensions, our results are concerned with the regime where d grows to infinity while the number of Monte Carlo samples, N, remains fixed. We deduce an explicit bound on the Monte Carlo error for estimates derived using the SMC sampler and the exact asymptotic relative L^2-error of the estimate of the normalising constant associated to the target. We also establish marginal propagation of chaos properties of the algorithm. These results are deduced when the cost of the algorithm is O(Nd^2). © Applied Probability Trust 2014

    A stable manifold MCMC method for high dimensions

    We combine two important recent advances in MCMC algorithms: first, methods utilizing the intrinsic manifold structure of the parameter space; second, algorithms effective for targets in infinite dimensions with the critical property that their mixing time is robust to mesh refinement. © 2014 Elsevier B.V.

    Exact simulation of diffusions

    We describe a new, surprisingly simple algorithm that simulates exact sample paths of a class of stochastic differential equations. It involves rejection sampling and, when applicable, returns the location of the path at a random collection of time instances. The path can then be completed without further reference to the dynamics of the target process.
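
    A minimal sketch in the spirit of such a rejection sampler, for the illustrative SDE dX_t = sin(X_t) dt + dW_t with unit diffusion coefficient (the easy case in which the relevant functional phi is bounded); the drift, horizon and constants are assumptions for the sketch, not the paper's construction in full generality.

```python
# Sketch: exact-algorithm-style rejection sampler for dX_t = sin(X_t) dt + dW_t on [0, T].
# Here A(x) = 1 - cos(x) is the antiderivative of the drift and
# phi(x) = (sin(x)^2 + cos(x)) / 2 lies in [-0.5, 0.625].
import numpy as np

rng = np.random.default_rng(3)
T, x0 = 1.0, 0.0
phi = lambda x: 0.5 * (np.sin(x) ** 2 + np.cos(x))
phi_low, phi_range = -0.5, 1.125                     # phi(x) - phi_low lies in [0, phi_range]

def draw_endpoint():
    # end point has density proportional to exp(A(y) - (y - x0)^2 / (2 T));
    # rejection against N(x0, T) works because exp(A(y)) <= e^2
    while True:
        y = rng.normal(x0, np.sqrt(T))
        if rng.uniform() < np.exp((1.0 - np.cos(y)) - 2.0):
            return y

def exact_skeleton():
    while True:                                      # rejection loop
        y = draw_endpoint()
        n = rng.poisson(phi_range * T)               # Poisson thinning points on [0, T] x [0, phi_range]
        t = np.sort(rng.uniform(0.0, T, size=n))
        v = rng.uniform(0.0, phi_range, size=n)
        # Brownian bridge from (0, x0) to (T, y), sampled sequentially at the times t
        w, t_prev, x_prev = np.empty(n), 0.0, x0
        for i, ti in enumerate(t):
            mean = x_prev + (ti - t_prev) / (T - t_prev) * (y - x_prev)
            var = (ti - t_prev) * (T - ti) / (T - t_prev)
            x_prev = rng.normal(mean, np.sqrt(var)); t_prev = ti
            w[i] = x_prev
        if np.all(v > phi(w) - phi_low):             # accept: no Poisson point falls below phi
            return t, w, y                           # skeleton; the rest of the path is Brownian bridge

t, w, y = exact_skeleton()
print(len(t), y)
```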